Pattern Analysis and Synthesis in Attractor Neural Networks
Authors
Abstract
The representation of hidden variable models by attractor neural networks is studied. Memories are stored in a dynamical attractor that is a continuous manifold of fixed points, as illustrated by linear and nonlinear networks with hidden neurons. Pattern analysis and synthesis are forms of pattern completion by recall of a stored memory. Analysis and synthesis in the linear network are performed by bottom-up and top-down connections. In the nonlinear network, the analysis computation additionally requires a rectification nonlinearity and inner product inhibition between hidden neurons.

One popular approach to sensory processing is based on generative models, which assume that sensory input patterns are synthesized from some underlying hidden variables. For example, the sounds of speech can be synthesized from a sequence of phonemes, and images of a face can be synthesized from pose and lighting variables. Hidden variables are useful because they constitute a simpler representation of the variables that are visible in the sensory input. Using a generative model for sensory processing requires a method of pattern analysis. Given a sensory input pattern, analysis is the recovery of the hidden variables from which it was synthesized. In other words, analysis and synthesis are inverses of each other.

There are a number of approaches to pattern analysis. In analysis-by-synthesis, the synthetic model is embedded inside a negative feedback loop. Another approach is to construct a separate analysis model. This paper explores a third approach, in which visible-hidden pairs are embedded as attractive fixed points, or attractors, in the state space of a recurrent neural network. The attractors can be regarded as memories stored in the network, and analysis and synthesis as forms of pattern completion by recall of a memory. The approach is illustrated with linear and nonlinear network architectures. In both networks the synthetic model is linear, as in principal...
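The linear case described in the abstract can be sketched in a few lines. In this minimal illustration (all dimensions and weight matrices are hypothetical, not taken from the paper), synthesis is the top-down map x = W h and analysis is the bottom-up map h = Wᵀ x; when W has orthonormal columns, one analysis-synthesis pass projects any visible pattern onto the column space of W, which is exactly a continuous manifold of fixed points:

```python
import numpy as np

# Hypothetical sizes: 5 visible units, 2 hidden units.
rng = np.random.default_rng(0)
W, _ = np.linalg.qr(rng.standard_normal((5, 2)))  # orthonormal columns

x = rng.standard_normal(5)   # a noisy visible pattern
h = W.T @ x                  # analysis: bottom-up recovery of hidden variables
x_completed = W @ h          # synthesis: top-down reconstruction

# The reconstruction lies on the manifold of fixed points:
# a further analysis/synthesis pass leaves it unchanged.
assert np.allclose(W @ (W.T @ x_completed), x_completed)
```

The projection interpretation is why recall here is one-shot: the recurrent dynamics of the full network would converge to the same point that this single bottom-up/top-down pass computes.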
Similar Resources
Improving Robust Pattern Recognition in Attractor Recurrent Neural Networks by Employing Chaos-like Dynamics
In this paper, two kinds of chaotic neural networks are proposed to evaluate the efficiency of chaotic dynamics in robust pattern recognition. The first model is designed based on natural selection theory. In this model, the attractor recurrent neural network intelligently guides the evaluation of chaotic nodes in order to obtain the best solution. In the second model, a different structure of ch...
Attractor Density Models with Application to Analyzing the Stability of Biological Neural Networks
An attractor modeling algorithm is introduced which draws upon techniques found in nonlinear dynamics and pattern recognition. The technique is motivated by the need for quantitative measures that are able to assess the stability of biological neural networks which utilize nonlinear dynamics to process information. Attractor Density Models with Application to Analyzing the Stability of Biologi...
Effects of Fast Presynaptic Noise in Attractor Neural Networks
We study both analytically and numerically the effect of presynaptic noise on the transmission of information in attractor neural networks. The noise occurs on a very short timescale compared to that for the neuron dynamics, and it produces short-time synaptic depression. This is inspired by recent neurobiological findings that show that synaptic strength may either increase or decrease on a sho...
Distribution Systems Reconfiguration Using Pattern Recognizer Neural Networks
A novel intelligent neural optimizer with two objective functions is designed for electrical distribution systems. The presented method is faster than alternative optimization methods and is comparable with the most powerful and precise ones. This optimizer is much smaller than similar neural systems. In this work, two intelligent estimators are designed, a load flow program is coded, and a spe...
A Scalable Architecture for Binary Couplings Attractor Neural Networks
This paper presents a digital architecture with on-chip learning for Hopfield attractor neural networks with binary weights. A new learning rule for the binary weights network is proposed that allows pattern storage up to a capacity of 0.4 and incurs very low hardware overhead. Due to the use of binary couplings, the network has minimal storage requirements. A flexible communication structure allows t...
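The binary-couplings idea in the snippet above can be illustrated with a small sketch. This is not the paper's proposed learning rule; it is a standard clipped-Hebb variant (weights are the sign of the Hebbian sum), with hypothetical network size and pattern count, showing how binary ±1 couplings still support attractor recall of corrupted patterns:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 100, 5                                   # hypothetical sizes
patterns = rng.choice([-1, 1], size=(P, N))     # stored ±1 memories

# Binary couplings: clip the Hebbian weight sum to ±1.
J = np.sign(patterns.T @ patterns).astype(int)
np.fill_diagonal(J, 0)

def recall(state, steps=10):
    """Synchronous sign-threshold dynamics toward the nearest attractor."""
    s = state.copy()
    for _ in range(steps):
        s = np.sign(J @ s)
        s[s == 0] = 1                           # break ties deterministically
    return s

# Corrupt a stored pattern by flipping a few bits, then recall it.
probe = patterns[0].copy()
probe[rng.choice(N, size=5, replace=False)] *= -1
recovered = recall(probe)
```

At this low memory load the recovered state has high overlap with the stored pattern, which is the pattern-completion behavior the hardware architecture exploits.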